The price of certainty: "waterslide curves" and the gap to capacity
The classical problem of reliable point-to-point digital communication is to
achieve a low probability of error while keeping the rate high and the total
power consumption small. Traditional information-theoretic analysis uses
"waterfall" curves to convey the revolutionary idea that unboundedly low
probabilities of bit-error are attainable using only finite transmit power.
However, practitioners have long observed that the decoder complexity, and
hence the total power consumption, goes up when attempting to use sophisticated
codes that operate close to the waterfall curve.
This paper gives an explicit model for power consumption at an idealized
decoder that allows for extreme parallelism in implementation. The decoder
architecture is in the spirit of message passing and iterative decoding for
sparse-graph codes. Generalized sphere-packing arguments are used to derive
lower bounds on the decoding power needed for any possible code given only the
gap from the Shannon limit and the desired probability of error. As the gap
goes to zero, the energy per bit spent in decoding is shown to go to infinity.
This suggests that to optimize total power, the transmitter should operate at a
power that is strictly above the minimum demanded by the Shannon capacity.
The lower bound is plotted to show an unavoidable tradeoff between the
average bit-error probability and the total power used in transmission and
decoding. In the spirit of conventional waterfall curves, we call these
"waterslide" curves.
Comment: 37 pages, 13 figures. Submitted to IEEE Transactions on Information Theory. This version corrects a subtle bug in the proofs of the original submission and improves the bounds significantly.
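The central tradeoff claimed in this abstract can be illustrated with a minimal numeric sketch. The 1/gap growth law for decoding energy and the constants E_min and c below are hypothetical stand-ins, not the paper's actual sphere-packing bounds; the point is only that any decoding cost that diverges as the gap closes pushes the total-energy optimum to a strictly positive gap.

```python
# Hypothetical illustration of the transmit/decode energy tradeoff.
# The 1/gap growth law and all constants are assumptions for this sketch;
# the paper derives actual lower bounds via sphere-packing arguments.
import numpy as np

E_min = 1.0   # normalized Shannon-minimum transmit energy per bit
c = 0.05      # hypothetical decoding-energy constant

gaps = np.linspace(1e-3, 1.0, 10_000)   # gap to capacity (normalized)
E_tx = E_min * (1.0 + gaps)             # transmit energy per bit
E_dec = c / gaps                        # decoding energy, diverges as gap -> 0
E_total = E_tx + E_dec

best = gaps[np.argmin(E_total)]
print(f"total energy per bit is minimized at gap = {best:.3f}, not at 0")
```

Under these toy assumptions the minimum falls at gap = sqrt(c/E_min), roughly 0.224, consistent with the abstract's conclusion that the transmitter should operate strictly above the Shannon minimum.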
Information Flow in Computational Systems
We develop a theoretical framework for defining and identifying flows of
information in computational systems. Here, a computational system is assumed
to be a directed graph, with "clocked" nodes that send transmissions to each
other along the edges of the graph at discrete points in time. We are
interested in a definition that captures the dynamic flow of information about
a specific message, and which guarantees an unbroken "information path" between
appropriately defined inputs and outputs in the directed graph. Prior measures,
including those based on Granger Causality and Directed Information, fail to
provide clear assumptions and guarantees about when they correctly reflect
information flow about a message. We take a systematic approach, iterating
through candidate definitions and counterexamples, to arrive at a definition
for information flow that is based on conditional mutual information, and which
satisfies desirable properties, including the existence of information paths.
Finally, we describe how information flow might be detected in a noiseless
setting, and provide an algorithm to identify information paths on the
time-unrolled graph of a computational system.
Comment: Significantly revised version which was accepted for publication in the IEEE Transactions on Information Theory.
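As a rough illustration of the final step described in this abstract, here is a minimal sketch of path-finding on a time-unrolled graph. The unrolling follows the construction the abstract describes (node u at time t feeds node v at time t+1 along each edge of the original graph), but the `carries_information` predicate is a hypothetical stand-in for the paper's conditional-mutual-information test.

```python
# Minimal sketch: unroll a computational system's directed graph over
# discrete time steps and search for an "information path" from an input
# to an output. The paper's conditional-mutual-information test is
# abstracted into a hypothetical predicate `carries_information`.
from collections import deque

def unroll(edges, T):
    """Time-unroll a directed graph: node u at time t connects to node v
    at time t+1 whenever (u, v) is an edge of the original graph."""
    return {((u, t), (v, t + 1)) for (u, v) in edges for t in range(T)}

def find_information_path(edges, T, source, sink, carries_information):
    """BFS over the time-unrolled graph, keeping only edges the
    (hypothetical) information test accepts; returns a path or None."""
    adj = {}
    for (a, b) in unroll(edges, T):
        if carries_information(a, b):
            adj.setdefault(a, []).append(b)
    start, goal = (source, 0), (sink, T)
    parent = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:                 # reconstruct the path back to start
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in adj.get(node, []):
            if nxt not in parent:
                parent[nxt] = node
                queue.append(nxt)
    return None

# Example: a three-node chain in -> mid -> out, unrolled over two steps.
edges = [("in", "mid"), ("mid", "out")]
path = find_information_path(edges, 2, "in", "out", lambda a, b: True)
print(path)  # [('in', 0), ('mid', 1), ('out', 2)]
```

The trivial always-true predicate in the example reduces the search to plain graph reachability; in the noiseless setting the abstract describes, that predicate would instead check whether the transmission along each edge retains information about the message.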